Raven's Progressive Matrices Completion with Latent Gaussian Process Priors

Authors

Abstract

Abstract reasoning ability is fundamental to human intelligence. It enables humans to uncover relations among abstract concepts and to further deduce implicit rules from those relations. As a well-known visual task, Raven's Progressive Matrices (RPM) are widely used in IQ tests. Although extensive research has been conducted on RPM solvers with machine intelligence, few studies have considered advancing the standard answer-selection (classification) problem to the more challenging answer-painting (generating) problem, which can verify whether a model has indeed understood the rules. In this paper, we aim to solve the latter by proposing a deep latent variable model, in which multiple Gaussian processes are employed as priors of latent variables to separately learn the underlying concepts from RPMs; the proposed model is thus interpretable in terms of concept-specific latent variables. The latent Gaussian process also provides an effective way of extrapolation for answer painting based on the learned concept-changing rules. We evaluate the model on RPM-like datasets with continuously-changing concepts. Experimental results demonstrate that our model requires only a few training samples to paint high-quality answers, generate novel panels, and achieve interpretability through concept-specific latent variables.
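The extrapolation idea behind answer painting — predicting how a concept continues into the missing panel from its trajectory across the observed panels — can be sketched with plain Gaussian process regression. The RBF kernel, length scale, and the linearly changing "concept" below are illustrative assumptions for the sketch, not the paper's actual model:

```python
import numpy as np

def rbf_kernel(a, b, length_scale=4.0):
    """Squared-exponential covariance matrix between 1-D inputs a and b."""
    sq = (a[:, None] - b[None, :]) ** 2
    return np.exp(-0.5 * sq / length_scale**2)

def gp_posterior_mean(x_train, y_train, x_test):
    """Zero-mean GP regression posterior mean, with jitter for stability."""
    K = rbf_kernel(x_train, x_train) + 1e-6 * np.eye(len(x_train))
    K_star = rbf_kernel(x_test, x_train)
    return K_star @ np.linalg.solve(K, y_train)

# Hypothetical concept trajectory: a visual attribute (e.g. object size)
# changes linearly across the first eight observed panels.
x_train = np.arange(8, dtype=float)
y_train = 0.5 * x_train + 1.0

# Extrapolate the concept value for the missing ninth panel; the GP
# posterior mean continues the learned concept-changing trend.
mean_extrap = gp_posterior_mean(x_train, y_train, np.array([8.0]))
```

In the paper's setting the GP prior sits over latent variables of a deep generative model rather than over raw attribute values, but the mechanism is the same: the posterior mean at an unseen panel position extrapolates the concept-changing rule inferred from the observed panels.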



Similar Articles

A latent variable Gaussian process model with Pitman-Yor process priors for multiclass classification

Mixtures of Gaussian processes have been considered by several researchers as a means of dealing with non-stationary covariance functions, discontinuities, multi-modality, and overlapping output signals in the context of regression tasks. In this paper, for the first time in the literature, we devise a Gaussian process mixture model especially suitable for multiclass classification applications...


Gaussian Process Priors with ARMA Noise Models

We extend the standard covariance function used in the Gaussian Process prior nonparametric modelling approach to include correlated (ARMA) noise models. The improvement in performance is illustrated on some simulation examples of data generated by nonlinear static functions corrupted with additive ARMA noise. In recent years many flexible parametric and semi-parametri...


Nonnegative Matrix Factorization with Gaussian Process Priors

We present a general method for including prior knowledge in a nonnegative matrix factorization (NMF), based on Gaussian process priors. We assume that the nonnegative factors in the NMF are linked by a strictly increasing function to an underlying Gaussian process specified by its covariance function. This allows us to find NMF decompositions that agree with our prior knowledge of the distribu...


Bayesian inference with rescaled Gaussian process priors

Abstract: We use rescaled Gaussian processes as prior models for functional parameters in nonparametric statistical models. We show how the rate of contraction of the posterior distributions depends on the scaling factor. In particular, we exhibit rescaled Gaussian process priors yielding posteriors that contract around the true parameter at optimal convergence rates. To derive our results we e...


Mixtures of Gaussian process priors

Nonparametric Bayesian approaches based on Gaussian processes have recently become popular in the empirical learning community. They encompass many classical methods of statistics, like Radial Basis Functions or various splines, and are technically convenient because Gaussian integrals can be calculated analytically. Restricting to Gaussian processes, however, forbids for example the implementi...



Journal

Journal title: Proceedings of the AAAI Conference on Artificial Intelligence

Year: 2021

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v35i11.17157